'Minority Report' Is Real — And It's Really Reporting Minorities
Every morning, officers across the nation get a predictive map that tells them where crime is likely to happen before it's committed. When the mapping systems used by the Detroit Police Department started showing an increase in auto theft and larceny downtown, Detroit's 1st Precinct got together and laid a trap.
The thieves were two local restaurant workers, casing cars on their smoke breaks and cracking windows. Thanks to the predictive maps, the precinct knew the rough locations and the times the suspects would be out. They even knew the make and model of the cars the perps would target, Officer Steve Shank told Mic. So the officers set out bait cars wired with cameras, sensors and GPS tracking devices. In the summer, the maps started to turn up bike theft, and the precinct set up bait bikes using the same techniques. In both cases, the criminals were caught. Crime was reduced. The police officers could watch the hotspots fade on the predictive map's interface.
In the film and now TV show Minority Report, a future police force uses psychics, called "precogs," to predict crime and identify criminals. But in the 21st century, we don't have clairvoyants or sci-fi magic. We have data projections, digital histories and social media profiles.
How today's pre-crime works
Today's crime-prediction tech doesn't identify suspects the way Hollywood's fictional pre-crime units do. Instead, traditional digital policing starts with historical maps, then layers them with on-the-ground knowledge to produce custom insights about how officers should adjust their behavior on their beats.
One example of popular mapping software is ESRI's ArcMap, which takes crime data — historical crime reports, call-ins, officer interactions, whatever else can be given a time and place — and makes a historical map. In Dallas, where precincts are so large that they're like cities unto themselves, crime analysts layer digital crime maps with locations of community stakeholders, census data, demographics, known gang activity and any other kind of data they can.
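To make that layering concrete, here is a rough sketch of the same kind of workflow using open-source Python tools rather than ArcMap itself. The file names and column names (incidents.csv, census_tracts.geojson, tract_id, offense_type) are hypothetical stand-ins, not the Dallas department's actual data.

```python
# A minimal sketch of layering crime data with a context layer, assuming
# hypothetical files and columns. This is not ESRI's ArcMap workflow, just an
# illustration of the same idea with geopandas.
import geopandas as gpd
import pandas as pd

# Historical incidents: one row per report, with a time, a type and coordinates.
incidents = pd.read_csv("incidents.csv", parse_dates=["reported_at"])
incidents = gpd.GeoDataFrame(
    incidents,
    geometry=gpd.points_from_xy(incidents["longitude"], incidents["latitude"]),
    crs="EPSG:4326",
)

# A context layer to overlay, e.g. census tracts carrying demographic attributes.
tracts = gpd.read_file("census_tracts.geojson").to_crs("EPSG:4326")

# Spatial join: tag every incident with the tract it falls inside, then count
# incidents per tract and offense type to build the "historical map."
joined = gpd.sjoin(incidents, tracts, how="inner", predicate="within")
counts = (
    joined.groupby(["tract_id", "offense_type"])
    .size()
    .rename("incident_count")
    .reset_index()
)
print(counts.sort_values("incident_count", ascending=False).head())
```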
"We engage the community, collect surveys, work with neighborhood associations and churches to set up crime watch groups, and encourage them to call if there's suspicious activity," Sgt. Steve Armon, head of operational technology for the Dallas Police Department, told Mic. "We're not sending an officer there and tell them, 'Stay and don't leave.'"
But then there are the maps produced by companies like PredPol, one of the leading predictive mapping companies, which claim to predict crime ahead of time like it's an oncoming storm. These systems show colored "hotspots" indicating where crime is likely to occur and when, and police are dispatched to spend extra time in those locations. There are no further recommendations for police behavior; the idea is simply to reduce the likelihood of crime by giving those areas extra patrol.
"It's not about getting perpetrators or arresting more people," Larry Samuels, CEO of PredPol, told Mic. "It's about getting ahead of crime. And if you reduce the amount of crime in a community, you change the mindset of the community."
PredPol uses only historical crime data drawn from reported crime. The software accounts for where crime happened, when it happened and what kind of crime it was.
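PredPol's model itself is proprietary, so the sketch below is only a toy illustration of the general idea: scoring grid cells of a city from nothing more than the where, when and what-kind fields described above, with recent incidents weighted more heavily. The grid size and decay half-life are invented parameters.

```python
# A toy hotspot score, not PredPol's algorithm: grid the city into cells and
# weight each reported incident by how recently it happened.
import pandas as pd

def hotspot_scores(incidents: pd.DataFrame,
                   cell_size: float = 0.005,      # roughly 500 m in latitude degrees
                   half_life_days: float = 30.0) -> pd.DataFrame:
    """Return grid cells ranked by recency-weighted incident counts."""
    now = incidents["reported_at"].max()
    age_days = (now - incidents["reported_at"]).dt.total_seconds() / 86400.0
    # An incident's weight halves every `half_life_days`.
    weight = 0.5 ** (age_days / half_life_days)

    cells = pd.DataFrame({
        "cell_x": (incidents["longitude"] // cell_size).astype(int),
        "cell_y": (incidents["latitude"] // cell_size).astype(int),
        "weight": weight,
    })
    return (cells.groupby(["cell_x", "cell_y"])["weight"]
                 .sum()
                 .rename("score")
                 .reset_index()
                 .sort_values("score", ascending=False))

# The highest-scoring cells are the "hotspots" that get colored on the map;
# filtering `incidents` by offense type first gives per-crime-type maps.
```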
Crime analysts and police departments tend to offer the same caveat, over and over, when discussing the software: The new predictive maps just repackage old intelligence, telling officers and chiefs that yes, there will be crime in the neighborhoods where there has been crime.
"A lot of times, these companies can't tell us things we don't already know," Armon told Mic. "We have our own crime analysis group, and we already apply the same principles they're using."
So why would anyone use it? With the sheen of a new selling point — the idea of futuristic pre-crime prediction, based solely on data — comes the promise of a policing revolution: the elimination of racism and prejudice in police practice.
Darrin Lipscomb, senior director of public safety and visualization at Hitachi, told Mic about the new crime-mapping tool Hitachi is rolling out to more than a dozen unnamed police departments this month. "It doesn't look at race," he said. "It just looks at facts."
The big problem
Predictive crime data could help eradicate racial profiling in policing if the data were clean of racial prejudice. Unfortunately, that data is generated based on systemic police practices that have marginalized ethnic minorities in this country for decades.
Systems that rely on historical crime data will, by their nature, give results reflective of traditional police practice: The biases of the human practices that generated the data in the first place are baked into the data itself. When it comes to drug crime, for example, a higher proportion of white Americans have used drugs than African-Americans, but African-Americans, largely because of traditional police practices, are the ones incarcerated at disproportionately high rates.
"The academic community suggests that crime, including serious violent crime, is reported to the police about 50% of the time," Christopher Herrmann, a professor at the John Jay School of Criminal Justice, told Mic. "That means, at best, these predictive software programs are beginning their predictions with only half of the picture."
Many of the crimes that go unreported are crimes where the victim doesn't feel like the police will adequately support them, like sexual assault and hate crimes. Diverting police resources toward hotspots identified by these maps means more attention is paid to types of crime already well-covered by police officers.
Cathy O'Neil is a mathematician whose upcoming book, Weapons of Math Destruction, explores how big data can amplify prejudice. O'Neil says that algorithmic models can only amplify and expose insights based on the human behavior that created the data in the first place.
"If you have a white suburb, you probably have very little data on that map," O'Neil told Mic. "But if you knew better, in almost every basement that has a teenager, there could be drug crime experimentation going on. The question isn't 'Are those events happening?' — it's 'Who's exposed?'"
"The question isn't 'Are those events happening?' It's 'Who's exposed?'"
The data used by systems like PredPol don't contain demographic information about residents, and the tools can't be deliberately exploited to target individuals. But because the data relies on metrics generated by traditional policing, the biases of the methods that created that data in the first place go unchanged or, at worst, get amplified.
"The underlying mathematical model isn't biased, but the data going into it is biased," O'Neil told Mic. "This mirage of objectivity is just a myth."
The $23 million industry
Regardless of the issues, police are surging ahead with predictive tech. Business is booming. Federal grant money is flowing, and the private sector is swooping in to take advantage of the hype. Besides PredPol, Hitachi and independent startups like HunchLab and Mark43, companies with global data interests are starting to add policing to their suites of services.
In the past two years, massive corporations acquired three of the major predictive policing startups. Motorola bought PublicEngines (which advertised more than 2,000 customers at time of acquisition), Trimble bought the Omega Group and public records service LexisNexis bought BAIR Analytics.
The services are costly. For smaller precincts, sources told Mic, prices can range from $30,000 to $50,000 to install the software, plus yearly "maintenance" costs of thousands more. The city of Alhambra, California, with a population of about 85,000, purchased PredPol's software for $27,500; Seattle paid $135,000 for the same service at a discount. Larger California cities like Oakland and San Jose are budgeting as much as $160,000 for PredPol installations.
The bigger the surveilled population, the more a company like PredPol charges, Samuels, the CEO, told Mic. He's been asked how much PredPol would cost for a "small nation-state," and he guessed it would be in the millions.
Herrmann, a former crime analysis supervisor with the New York Police Department, has spent the past 10 years on the research side of predictive policing. He describes the sudden boom in business as a private-sector gold rush, arriving just as police departments face mounting pressure to use modern tools to cut costs.
"A city council says, 'How come you're not using the latest and greatest stuff?'" Herrmann told Mic. "And if the commissioner or chief thinks it's a waste of money, they'll look like a nonprogressive commissioner."
Many of these systems are being paid for by grants from the National Institute of Justice, the research arm of the U.S. Justice Department. The NIJ has been giving millions in grant money to local police departments experimenting with predictive policing, which means departments don't have to be as prudent when taking on private contracts. Since 2003, the institute has awarded nearly $24 million in grants across dozens of programs for crime mapping and geospatial analysis.
"That money's from a federal grant, so the city doesn't have a financial stake," Rachel Levinson-Waldman, senior counsel for the Brennan Center for Justice, told Mic. "Then there isn't that much local oversight, because they didn't spend that much."
The pressure from taxpayers and legislators to keep the police force modern and innovative can lead departments to report regular and effective use of the tools even when they aren't part of a police department's everyday work, according to Herrmann.
What these tools can't do
Police departments are spending millions on systems that may not be able to improve on human intelligence. A common criticism of programs like PredPol is that the tools might be sexy and buzzworthy, but that they simply make more visible the insights police already have. One criminologist we spoke to called it "old wine in new bottles."
The sales pitch for these companies is that by reducing crime, they end up saving cities much more money than they cost. "If you reduce the crime rate in a city like LA by 10%, the economic benefit is in the billions of dollars," Samuels told Mic. "From an economic perspective, it's insanely invaluable."
PredPol's Samuels says that crime reduction averages around 20%, but a hard look at the numbers PredPol reports about itself doesn't paint a clear portrait of the company's alleged success. An analysis of the data by Techdirt shows how PredPol routinely cherry-picks data for its exhaustive roster of success stories, a dissonance compounded by the fact that PredPol requires many police departments to participate in marketing drives and PR initiatives.
Not only is the jury very much out on the efficacy of crime mapping, but there's an enormous lack of transparency. Most of these predictive mapping systems are black boxes; the algorithms themselves can't be reviewed. Data goes in one end, maps come out the other, and don't ask too many questions about what happens along the way.
Across Mic's reporting, consultations with experts in the field and prior reports on these systems, no one could point to a holistic, independent study comparing the long-term effectiveness of these products against one another.
Police need better information.
Knowing where crime is happening is just the beginning. Beyond the question of improving police practices, geography sometimes matters less than other factors in deciding how to address certain types of crime.
"Trying to predict future crimes based on hotspots just tends to show areas of high crime," Deputy Chief Jonathan Lewin, chief technology office of the Chicago Police Department, told Mic. "But we found with gang violence that it's less important where the crime is committed, but who they're going to target."
Even when hotspots are valuable, they don't expose the underlying causes of crime. In Memphis, Tennessee, in the early 2000s, crime maps like the ones we have today were showing a strange spread of crime throughout the city, with hotspots popping up in unexpected places. It took the close analysis of a local criminologist, Richard Janikowski, to determine why.
Janikowski's wife was a housing expert at the local university, and together they traced the spread of crime into unfamiliar neighborhoods to the displacement of lower-income families after the abrupt demolition of the city's housing projects.
Pointing to a map and saying "crime is here" is not a sufficient insight to justify spending hundreds of thousands of dollars on a system when an experienced police officer who has been working his beat for years already knows what volume of crime to expect from one neighborhood or another.
"If the main benefit of a predictive policing algorithm is identifying target areas, I'm skeptical of departments allocating resources and capital," Eric Piza, a professor at the John Jay School of Criminal Justice, told Mic. "Most police departments already do a really good job of figuring out where crime concentrates."
Piza, formerly the geographic information systems specialist for Newark, New Jersey, knows that you can apply human rigor to look beyond the cold geospatial maps created by pre-crime mapping systems. He's done it, and he's done it for free. In Newark, he helped the department take known criminogenic insights, the commonly held notions that a given factor or piece of information leads to crime, and examine them against the data.
"One was the home addresses of parolees, specifically for violent crime or narcotics," Piza told Mic. "Our analysis found that there's no statistical support for that observation. In no instance was the home address of these parolees found significant in predicting crime."
Piza was using RTM Diagnostics, a tool he helped build with researchers from Rutgers University ("RTM" stands for "risk terrain modeling"). The tool is available to download online, complete with educational seminars and personal support, for $650, a far cry from the six-figure price tag of a private, outsourced map for a city like Newark. Chicago has worked with RTM Diagnostics, and Atlantic City, New Jersey, started using it in October.
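RTM Diagnostics itself isn't reproduced here, but the kind of test Piza describes can be sketched in a few lines. The snippet below, with a hypothetical study_cells.csv and made-up column names, checks whether a claimed risk factor (being near a parolee's home address) is a statistically significant predictor of whether violent crime occurred in a grid cell.

```python
# A sketch of testing a "criminogenic insight" against data, assuming a
# hypothetical file with one row per grid cell. This is not the RTM Diagnostics
# software, just the same kind of significance test described above.
import pandas as pd
import statsmodels.api as sm

cells = pd.read_csv("study_cells.csv")
y = cells["violent_crime_occurred"]                    # 1 if any violent crime hit the cell
X = sm.add_constant(cells[["near_parolee_address"]])   # 1 if near a parolee's home address

model = sm.Logit(y, X).fit(disp=False)
print(model.summary())
print("p-value:", model.pvalues["near_parolee_address"])

# A large p-value would mean the data offer no statistical support for the
# commonly held notion, which is the result Piza reports for Newark.
```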
"This money is always something that can be spent training officers in different kinds of community policing, social services, job training," Levinson-Waldman of the Brennan Center for Justice told Mic. "Even very effective predictive policing isn't a magic bullet."
But that's precisely how the contemporary pre-crime systems are marketed: with the Silicon Valley gloss that says that with increased efficiency and allocation of resources we can solve society's direst issues, like prejudicial police practices.
No software is a worthy substitute for a crime analyst.
"As we're speaking, I'm getting emails from vendors, every hour of every day." Chicago's Lewin told Mic when asked about how often private companies pitch him on a new predictive policing system. "These vendors all get very excited when they think of big departments, because they think there are unlimited budgets."
Yet the work of counteracting systemic prejudice isn't simply a matter of making that system more efficient with its resources. These systems, after all, don't tell police what to do once they get to the hotspots.
No algorithm is a worthy substitute for a human police officer or crime analyst. In downtown Detroit, Shank, the police officer, used to be able to set his watch by sports games and concerts, reliable sources of crime spikes anywhere in the country. But the rehabilitation of downtown Detroit means the face of crime is evolving: New residences and local businesses mean increases in homelessness and larceny as the city faces a financial crisis.
Shank keeps his eye on the maps and reports, which he says are still invaluable, but he spends his best hours having coffee with local business owners and organizing trips to the movie theater for senior citizens. His most valuable analytics come from phone calls from trusting locals and from community meetings.
An algorithm is no match for old-school policing.
"You have to be careful that the technology doesn't prevent you from doing the old-school policing," Shank told Mic. "Good officers are going to know who the people you are in your area who get out of their squad car. For most departments that are under siege, it's because they've gone away from getting to know their communities."
The worst-case scenario is that police are let off the hook for discriminatory practices. One of the benefits of claiming access to the latest, glitziest crime technology is being able to off-load the responsibility of making police practices more sensitive.
"It gives the police a way out: 'Oh, It's not my cops that are profiling, we're going based on the data,'" said Herrmann, the former crime analysis supervisor for the NYPD. "Because it's data-driven, it can exonerate the police. But the software is going to point them to the same places they're already policing.
Malkia Cyril is the executive director of the Center for Media Justice, a grassroots network of activist organizations concerned with racial inequality as it relates to tech — how technology is often touted as a tool for democracy and justice, but can, in practice, be a tool for the redistribution of power and wealth.
Cyril knows that there are hotspots for her hometown of Oakland, California. She wants to see them, and she's not the only one in her community. To Cyril, a lack of transparency means a lack of equity.
"The data itself doesn't remove the bias, it only exacerbates it, and reproduces the inequality that gave you the data in the first place," she said.
"One hundred percent of the time, the suggested intervention at a hotspot is more police, and there's a lack of imagination or interest in any other possible approach," Cyril told Mic. "It gives police a one-dimensional view of what's happening in a community."
She says that in other hands, predictive maps could be a more empathetic, effective guideline for other kinds of services: social services, community assistance, job training — things that are proven to reduce crime in a community without police intervention.
"People keep turning to technology like it's a savior," Cyril said. "But you can't insert technology into inequality and expect it not to produce inequality."
Instead, startups are exploring new frontiers of what they can add into the mix, like social media and biometric data, to get a competitive edge over other monied startups. It's the kind of rat race that makes even well-protected, upper-middle-class Americans nervous about a potential dystopian surveillance state of the future. But if you ask the people for whom risk of arrest is an everyday reality, this isn't science fiction.
"The future is now," Cyril told Mic. "If you're a person of color, or you've been convicted of a crime, or you've been on public benefits, the future is now."